Towards a Data Quality Framework for Heterogeneous Data
Every industry generates significant data output as a product of its working processes, and with the recent advent of big data mining and integrated data warehousing there is a clear case for a robust methodology for assessing data quality for sustainable and consistent processing. In this paper a review of Data Quality (DQ) across multiple domains is conducted in order to propose connections between their methodologies. This critical review suggests that, within the process of DQ assessment of heterogeneous data sets, the sets are not often treated as separate types of data in need of an alternate data quality assessment framework. We discuss the need for such a directed DQ framework and the opportunities foreseen in this research area, and propose to address it through degrees of heterogeneity
Towards a framework for engineering big data: An automotive systems perspective
Demand for more sophisticated models to meet big data expectations requires significant data repository obligations, operating concurrently in higher-level applications. Current models provide only disjointed modelling paradigms. The proposed framework addresses the need for higher-level abstraction, using low-level logic in the form of axioms from which higher-level functionality is logically derived. The framework facilitates the definition and usage of subjective structures across the cyber-physical system domain, and is intended to converge the range of heterogeneous data-driven objects
Hybrid Dynamic Modelling of Engine Emissions on Multi-Physics Simulation Platform
This paper introduces a hybrid dynamic modelling approach for the prediction of NOx emissions from a Diesel engine, based on a multi-physics simulation platform coupling a 1-D air path model (GT-Suite) with an in-cylinder combustion model (CMCL Stochastic Reactor Model Engine Suite). The key motivation for this research was the requirement to establish a real-time stochastic simulation capability for emissions prediction early in engine development, which required the replacement of the slow combustion chemistry solver (SRM) with an appropriate surrogate model. The novelty of the approach is the introduction of a hybrid approach to metamodelling that combines dynamic experiments for the gas path model with a zonal optimal space-filling design of experiments (DoE) for the combustion model. The dynamic experiments run on the virtual Diesel engine model (GT-Suite) were used to fit a dynamic model for the parameters required as input to the SRM. An Optimal Latin Hypercube (OLH) DoE run on the SRM model was used to fit a response surface model for the NOx emissions. This surrogate NOx model then replaced the computationally expensive SRM simulation, enabling real-time simulation of transient drive cycles. The performance of the proposed approach was validated on a simulated NEDC drive cycle against experimental data collected for the engine case study, which demonstrated the capability of the methodology to capture the transient trends in NOx emissions. The significance of this work is that it provides an efficient approach to the development of a global model with real-time transient modelling capability, based on the integration of dynamic and local DoE metamodelling experiments
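The DoE-plus-response-surface step can be sketched as follows. This is a minimal illustration, not the authors' code: the basic (non-optimised) Latin Hypercube sampler, the quadratic surrogate form, and the toy function standing in for the expensive SRM solver are all assumptions for demonstration.

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    # One point per stratum in each dimension, with independently
    # shuffled stratum orderings (a basic, non-optimised LHS)
    u = rng.random((n_samples, n_dims))
    perms = np.column_stack([rng.permutation(n_samples) for _ in range(n_dims)])
    return (perms + u) / n_samples

rng = np.random.default_rng(0)
X = latin_hypercube(50, 2, rng)  # 50 design points over 2 normalised inputs

# Hypothetical stand-in for the expensive combustion solver (NOT the SRM)
y = np.sin(3.0 * X[:, 0]) + X[:, 1] ** 2

# Quadratic response-surface surrogate fitted by least squares
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0] ** 2, X[:, 1] ** 2, X[:, 0] * X[:, 1]])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def surrogate(x1, x2):
    # Cheap evaluation that replaces the slow solver in transient runs
    return coef @ np.array([1.0, x1, x2, x1 ** 2, x2 ** 2, x1 * x2])

y_pred = surrogate(0.4, 0.6)
```

An *optimal* Latin Hypercube would additionally maximise a space-filling criterion (e.g. maximin point-to-point distance) over candidate designs; the shuffled-stratum version above keeps the sketch short.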
Automotive IVHM: Towards Intelligent Personalised Systems Healthcare
Underpinned by a contemporary view of automotive systems as cyber-physical systems, characterised by progressively open architectures increasingly defined by their interaction with users and the smart environment, this paper provides a critical and up-to-date review of automotive Integrated Vehicle Health Management (IVHM) systems. The paper discusses the challenges of prognostics and intelligent health management of automotive systems, and proposes a high-level framework, referred to as the Automotive Healthcare Analytic Factory, to systematically collect and process heterogeneous data from across the product lifecycle towards actionable insight for personalised healthcare of systems. Jaguar Land Rover funded research "Intelligent Personalised Powertrain Healthcare" 2016-201
Leupold & Stevens Inc. Position and Competitive Advantage in the Riflescope Industry 2005
In 1947, Leupold created the first riflescope (the Plainsman) that was purged of oxygen and filled with nitrogen to eliminate any chance of internal fogging. Since then, Leupold & Stevens (Leupold) has become one of America's optics authorities, familiar to hunters and competitors worldwide. With the help of Porter's five competitive forces analysis, we analyze Leupold's strategic position and competitive advantage in today's optics industry
Diagnostics and prognostics for complex systems: A review of methods and challenges
Diagnostics and prognostics have significant roles in the reliability enhancement of systems and are focused topics of active research. Engineered systems are becoming more complex and are subjected to miscellaneous failure modes that adversely impact their performability. This ever-increasing complexity makes fault diagnostics and prognostics challenging for system-level functions. A significant number of successes have been achieved and acknowledged in some review papers; however, these reviews have rarely focused on the application of complex engineered systems, nor provided a systematic review of diverse techniques and approaches to address the related challenges. To bridge the gap, this paper firstly presents a review that systematically covers the general concepts and recent development of various diagnostics and prognostics approaches, along with their strengths and shortcomings for the application of diverse engineered systems. Afterwards, given the characteristics of complex systems, the applicability of different techniques and methods capable of addressing the features of complex systems is reviewed and discussed, and some of the recent achievements in the literature are introduced. Finally, the unaddressed challenges are discussed by taking into account the characteristics of automotive systems as an example of complex systems. In addition, future development and potential research trends are offered to address those challenges. Consequently, this review provides a systematic view of the state of the art and case studies with reference value for scholars and practitioners. The full text of this article will be released for public view at the end of the publisher embargo on 10 July 2022
Integration of Hidden Markov Modelling and Bayesian Network for Fault Detection and Prediction of Complex Engineered Systems
This paper presents a methodology for fault detection, fault prediction and fault isolation based on the integration of hidden Markov modelling (HMM) and Bayesian networks (BN). This addresses the non-linear and non-Gaussian data characteristics to support fault detection and prediction, within an explainable hybrid framework that captures causality in the complex engineered system. The proposed methodology is based on the analysis of the pattern of similarity in the log-likelihood (LL) sequences against the training data for the mixture-of-Gaussians HMM (MoG-HMM). The BN model identifies the root cause of detected/predicted faults, using the information propagated from the HMM model as empirical evidence. The feasibility and effectiveness of the presented approach are discussed in conjunction with the application to a real-world case study of an automotive exhaust gas aftertreatment system. The paper details the implementation of the methodology on this case study, with data available from real-world usage of the system. The results show that the proposed methodology identifies the fault faster and attributes the fault to the correct root cause. While the proposed methodology is illustrated with an automotive case study, it is much more widely applicable to the fault detection and prediction problem of any similar complex engineered system. The full text will be available at the end of the publisher's embargo: 28th May 202
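The LL-based detection idea can be illustrated roughly as follows: an HMM is fitted to healthy operating data, and a drop in the log-likelihood of new observation windows relative to the training data signals a fault. This is a sketch under assumed parameters, using plain Gaussian emissions rather than the paper's MoG-HMM, and is not the authors' implementation.

```python
import numpy as np

def gaussian_logpdf(x, mu, sigma):
    return -0.5 * (np.log(2 * np.pi * sigma ** 2) + ((x - mu) / sigma) ** 2)

def hmm_log_likelihood(obs, pi, A, mus, sigmas):
    """Forward algorithm in log space for a 1-D Gaussian-emission HMM."""
    log_alpha = np.log(pi) + gaussian_logpdf(obs[0], mus, sigmas)
    for x in obs[1:]:
        m = log_alpha.max()
        log_alpha = (np.log(np.exp(log_alpha - m) @ A) + m
                     + gaussian_logpdf(x, mus, sigmas))
    m = log_alpha.max()
    return m + np.log(np.exp(log_alpha - m).sum())

# Hypothetical "healthy" model: two operating modes with means 0 and 1
pi = np.array([0.5, 0.5])
A = np.array([[0.95, 0.05], [0.05, 0.95]])
mus, sigmas = np.array([0.0, 1.0]), np.array([1.0, 1.0])

rng = np.random.default_rng(1)
healthy = rng.normal(0.0, 1.0, 80)   # matches the trained model
faulty = rng.normal(5.0, 1.0, 80)    # shifted sensor signal

ll_healthy = hmm_log_likelihood(healthy, pi, A, mus, sigmas)
ll_faulty = hmm_log_likelihood(faulty, pi, A, mus, sigmas)
# A large LL drop on the faulty window flags the anomaly
```

In the paper's framework the flagged LL anomaly would then be passed to the BN as evidence for root-cause isolation; that step is omitted here.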
A course in statistical engineering
A course in statistical engineering has recently been added to the Ford Motor Company's Technical Education Program. The aim was to produce materials suitable for use by Ford but which could also be promoted by the UK's Royal Statistical Society within the university sector. The course is built around a sequence of realistic tasks dictated by the flow of the product creation process. Its structure and content are thus driven by engineering need rather than statistical method, promoting constructivist learning. Before describing the course content we review the changing role of the engineer and comment on the relationships between Systems Engineering, Design for Six Sigma and Statistical Engineering. We give details of a case study which plays a crucial role in the course. We focus on some important features of the development process and conclude with a discussion of the approach we have taken and possible future developments
India – Strategic Software Powerhouse and Effects of Outsourcing
This paper discusses the strategies and issues involved in how India has positioned itself to be a software powerhouse in the world market. The paper reviews the current situation in India and how the government's actions have helped attract high-tech corporations to outsource their software development, call centers and other support services to India. Next, the paper explains what major US corporations are doing to outsource their needs to Indian companies and move jobs offshore. We then carefully observe how this trend affects, and will affect, the US economy and its future. The paper closes with some possible recommendations for resolving the problems, provides discussion questions for further critical thinking and research, and concludes with key learnings found in researching this topic